200 research outputs found
HEIDE: An IDE for the Homomorphic Encryption Library HElib
Work in the field of Homomorphic Encryption has exploded in the past 5 years, after Craig Gentry proposed the first encryption scheme capable of performing Homomorphic Encryption. Under such a scheme one can encrypt data, perform computations on the encrypted data (without ever seeing the original data), and then decrypt to obtain the same result as if the computations had been run on the unencrypted data.
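To make the homomorphic property concrete, here is a minimal, hedged sketch using the Paillier cryptosystem, which is additively homomorphic. It is a toy with insecure parameters and is not HElib or the scheme discussed above; it only shows that combining ciphertexts yields the encryption of a result that was never computed on the plaintexts.

```python
# Toy Paillier cryptosystem (additively homomorphic). NOT HElib, NOT secure:
# the primes are tiny and chosen purely for illustration.
import math
import random

def keygen(p: int = 104729, q: int = 104723):
    """Generate a toy Paillier key pair from two small primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)      # Carmichael's lambda for n = p*q
    mu = pow(lam, -1, n)              # modular inverse of lambda mod n
    return (n,), (lam, mu, n)         # public key: n (generator g = n + 1 implied)

def encrypt(pub, m: int) -> int:
    (n,) = pub
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:        # r must be invertible mod n
        r = random.randrange(2, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c: int) -> int:
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n    # L(x) = (x - 1) / n, then multiply by mu

pub, priv = keygen()
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
c_sum = (c1 * c2) % (pub[0] ** 2)     # multiplying ciphertexts adds the plaintexts
print(decrypt(priv, c_sum))           # -> 100, as if computed on the plaintexts
```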
Such a scheme has wide-reaching implications for cloud computing. Computations on sensitive data, just like regular data, could now be performed in the cloud with the added security that even the cloud service provider couldn't see the secure data. Given such a benefit, one might ask why the scheme is not in use today. The reason is that, while Craig Gentry's scheme was theoretically sound, it was not fast. As such, recent work has focused on finding ways to speed up the scheme. Several improvements in speed have been made and several implementations of those improved schemes have been developed, one being HElib.
As of now, HElib is self-described as "an assembly language for HE". Our work focused on creating HEIDE, a Homomorphic Encryption IDE, where researchers can write tests at a high level. This high-level code is then compiled into the operations provided by HElib. HElib, like most encryption schemes, can be configured using different setup parameters, which change the run time and security of the scheme. We have therefore also provided an easy way for researchers to run their tests under different setup parameters simultaneously. To support that, timing and memory metrics are reported for each test so that researchers can determine which parameters worked best.
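As a rough illustration of the parameter-sweep idea (not HEIDE's or HElib's actual interface), the following sketch runs a placeholder workload under several hypothetical parameter sets and records wall-clock time and peak memory for each. The parameter names and the workload are invented for illustration.

```python
# Hedged sketch of a parameter sweep with per-run timing and memory metrics.
import time
import tracemalloc

def run_test(params: dict) -> None:
    # Placeholder workload standing in for an encrypted computation; its cost
    # grows with the (hypothetical) security parameter to mimic real behaviour.
    data = [i * i for i in range(10_000 * params["security_level"] // 80)]
    sum(data)

parameter_sets = [
    {"security_level": 80,  "plaintext_modulus": 2},
    {"security_level": 128, "plaintext_modulus": 2},
    {"security_level": 192, "plaintext_modulus": 257},
]

for params in parameter_sets:
    tracemalloc.start()
    t0 = time.perf_counter()
    run_test(params)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()   # (current, peak) bytes
    tracemalloc.stop()
    print(f"{params}: {elapsed:.3f} s, peak {peak / 1024:.1f} KiB")
```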
From bench to bedside: Tracing the payback forwards from basic or early clinical research – A preliminary exercise and proposals for a future study
EXECUTIVE SUMMARY
Chapter 1 : Introduction
• The members of the research team from HERG and the Wellcome Trust have conducted previous studies showing that it is possible both to assess the payback from applied health research, and to use bibliometrics to trace the links between generations of research and clinical guidelines. In another of the team’s studies, however, it proved difficult to replicate the major study by Comroe and Dripps (1976) that had identified clinical advances and then worked backwards to show that they had relied on earlier basic research. Therefore, the study reported here sets out to use the methods developed in our previous studies of payback to undertake analysis that starts with more basic or early clinical research and traces the research lines forwards to clinical applications. Whilst this preliminary study involved preparation for a future large-scale study, it was hoped that it would also provide an interesting case study.
• Starting with the research outputs of one team 20 years ago, called the 1st generation papers, the preliminary study has three main elements: standard bibliometric analysis through several generations of papers; categorisation of the citations; and qualitative analysis using questionnaires, critical pathway analysis and interviews to trace the impact of the 1st generation of research.
• Diabetes and cardiology were suggested as possible topics on which to base the study. Initial reviews identified two bodies of research in diabetes as being potentially suitable for reasons such as the continuing activity of key members of the team.
• The research into diabetes conducted in 1981 by George Alberti and his team at Newcastle, and collaborators elsewhere, was selected to provide the case study for this preliminary stage for several reasons. It was thought to have been important science and there was a belief that some of it had made a contribution to clinical practice.
Chapter 2 : Bibliometric analysis
• An original plan to look at publications produced over a three year period was changed to looking at the output of just one year, 1981, because in that year alone Alberti and colleagues published 29 articles. These form the 1st generation papers and the average number of citations they received is high. Identifying the citations given to these 29 papers resulted in 799 2nd generation papers and 12,891 3rd generation papers. The numbers involved meant that it was impractical to go beyond the 3rd generation. Within the high overall average, the variation in the number of citations per paper was considerable, ranging from 76 down to just one. Similarly, the half-lives of the 29 papers, ie the time taken for an article to receive 50% of its citations, ranged from two years to 11; a small sketch of this calculation follows this list.
• Articles can be given a Research Level (ie one of four levels from clinical observation to basic) based on the journals in which they appear. Such analysis demonstrates the breadth of Alberti’s work because the 29 articles are spread across all four Research Levels. Crucially, there was not a shift from basic to more clinical levels across the generations. The higher than average number of authors and addresses per paper is testimony to Alberti’s extensive collaborations.
• The funding acknowledgements reveal the high proportion of papers supported, at least partially, by one funder: the British Diabetic Association, now Diabetes UK, which provided core support for Alberti’s Newcastle team.
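As a rough illustration of the half-life measure used in the bullet on citation counts above, the following sketch computes the first year by which a paper has received half of its citations; the per-year citation counts are hypothetical.

```python
# Citation half-life: years after publication needed to reach 50% of citations.
def citation_half_life(citations_per_year: list[int]) -> int:
    """Return the first year (1-indexed) by which >= 50% of all citations arrived."""
    total = sum(citations_per_year)
    running = 0
    for year, count in enumerate(citations_per_year, start=1):
        running += count
        if running * 2 >= total:
            return year
    return len(citations_per_year)

# Citations received in years 1..10 after publication (hypothetical paper).
print(citation_half_life([3, 8, 12, 9, 6, 4, 3, 2, 1, 1]))  # -> 4
```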
Chapter 3 : Categorisation of citations
• Traditional citation analysis does not allow identification of the importance of the cited article to the citing article, and therefore limits the ability to use citation analysis to trace the impact of basic or early research on later research. We conducted a review of the literature on the meaning of citations.
• From this review, a template was devised that allowed the location, nature and importance of citations to be recorded as well as the type of research (basic or clinical) described in the paper. This was used by six assessors on a sample of papers and inter-rater reliability was tested. Further work is required to refine the template and its definitions, and to improve its consistency in application.
• Nevertheless, for initial analysis, it was applied to 623 out of the 799 2nd generation papers. A four point scale was used for the importance of the cited paper to the citing paper. In just 9% of cases was the cited 1st generation paper thought to be in one of the top two categories, ie of Considerable or Essential importance to the citing paper.
• Statistical analysis revealed no relationship between the number of citations a paper received and the proportion of citations where the cited paper was classified as being of high (ie Considerable or Essential) importance to the citing paper. Self-citations, however, were shown to be significantly more likely to be in this category; an illustrative sketch of these two checks follows this list.
• The classification of the type of research (basic or clinical) by our analysis of each paper broadly agreed with the classification of the journals by Research Level.
• The time constraints involved in applying the template, plus the lack of any overall pattern in terms of correlations between number and importance of citations, might point to the desirability of adopting a more selective approach, guided by qualitative analysis. In any selective approach, however, it is likely that self-citations should feature.
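The statistical checks mentioned in the bullet above could, for example, be run as follows. The data, and the choice of a Spearman correlation and a Fisher exact test, are purely illustrative assumptions, not necessarily the tests used in the study.

```python
# Hedged sketch: (1) correlation between citation count and the share of
# citations rated high importance; (2) are self-citations more often rated
# high importance than other citations? All numbers are hypothetical.
from scipy.stats import spearmanr, fisher_exact

# Hypothetical per-paper data: total citations and share rated high importance.
citation_counts = [76, 54, 40, 33, 25, 18, 12, 9, 5, 1]
high_importance_share = [0.05, 0.12, 0.08, 0.10, 0.04, 0.15, 0.09, 0.00, 0.20, 0.00]
rho, p_value = spearmanr(citation_counts, high_importance_share)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")

# Hypothetical 2x2 table: rows = (self-citation, other), cols = (high, not high).
table = [[14, 26], [42, 541]]
odds_ratio, p_value = fisher_exact(table)
print(f"self-citation odds ratio = {odds_ratio:.1f}, p = {p_value:.3g}")
```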
Chapter 4 : Qualitative analysis
• Given the number of co-authors, it seemed appropriate to send them a questionnaire rather than attempt to interview them. Therefore the interviewing was rather more concentrated than originally intended. Only one formal critical pathway was created, but it was undertaken by an expert in the field who worked with Alberti at Newcastle.
• Some problems emerged in taking 1981 as the starting point for the study. Alberti identified 10 selected papers from the 1970s and 1980s that he felt had had most impact on clinical practice. These helped to give us both a better understanding of the payback from our 1st generation, or 1981, papers, and provided further material for analysis.
• Attempting to describe the impact from the 1981 body of work, and from the 10 selected papers, underlines the complex reality of how science advances and influences clinical practice. If they make a contribution at all, most studies make a small, incremental one.
• A few papers, however, have been shown to have a considerably greater impact. A possible key to the level of payback indicated is the enormous breadth of Alberti’s contacts, and fields and methods of working, to which various references were made. This is well illustrated in the account of how the idea for subcutaneous pumps came about. Similarly, the ability to produce the very important guidelines on treating diabetics during surgery, and diabetic coma, partly resulted from the application to clinical problems of the understandings gained from some of the basic/early clinical studies. It is significant that the key papers on these issues, all of which come from the list of 10 selected papers from the 1970s and 1980s, were having an impact on the 1981 work.
• How far the collection of papers from 1981 have been drawn upon in similar ways is less clear. Nevertheless, papers on treating diabetics during open heart surgery, and on bolus delivery of insulin at meal times, were key parts of these wider streams, despite variable citation levels. Furthermore, various papers, including on acarbose, on portal infusion of insulin, and on semi-human insulin, were important steps in bodies of work in their respective areas. The complexity was illustrated by a paper that helped debunk the Chlorpropamide alcohol flushing hypothesis, and thus end a line of scientific enquiry: there was payback in stopping an incorrect line of inquiry, but nothing on which to build.
• Each technique in the qualitative study produced information about the successful subsequent careers followed by many researchers trained through working with Alberti.
• Historical perspectives, and insider expert opinions, were important in the qualitative analysis. Overall, the qualitative methods highlighted some limitations in the bibliometric approach but also showed how aspects of the citation analysis can complement the opinions expressed, for example about the importance of the breadth of Alberti’s work.
Chapter 5 : Lessons learnt and the way forward
• Lessons learnt: a variety of methods can be used successfully to gather considerable data about the payback from a body of research undertaken 20 years ago. Traditional citation analysis alone, however, is not sufficient: the importance of the surgery papers despite their relatively low citation rates illustrates this. The qualitative methods are important and much of the analysis is strengthened by drawing on multiple approaches. Several problems remain, including: identifying a coherent starting point for the analysis; coping with the enormous number of papers involved in later generations; and refining the template for categorising citations and developing ways of fully utilising the results from applying it.
• Preparing for the large-scale study: this preliminary study provides a basis on which to attempt to undertake the larger study we envisaged. Issues now being addressed include identification of the level of bibliometric/citation analysis necessary to complement any qualitative studies. To provide confidence in the findings from an eventual large-scale study, we will need to expand the focus. The study will need to cover at least four sets of case studies. Ideally, each set should focus on a number of research groups working in a country in the same field. We hope there will be sets of case studies in two or three fields and in at least two countries. The issues to be explored will include ones highlighted by this study such as breadth of work, level of collaboration, and the role of core funding.
• Methods for the large-scale study: for each case study we now propose to employ two methodological elements based on the qualitative and quantitative techniques adopted in the preliminary study. They will work in parallel but the quantitative bibliometric analysis would be applied selectively to parts of ‘research lines’ (ie discrete themes of research) identified in the qualitative studies as being important in influencing clinical practice.
• Presenting the findings: each research line could be written up in a standardised document that would use the HERG payback model and categories to describe the impact of that research. We shall use the qualitative and quantitative data to compare and contrast the ‘payback’ of research lines by country and disease, and then identify common factors that correlate with the translation of basic or early clinical research.
• Concluding comments: in the era of ‘evidence based policy’, research funders are looking for value for money in the research they support and for evidence on the effectiveness of different research strategies. In this study we have begun developing a methodology that will allow us to understand the complexity of research development over a series of generations. The utility of the policy research we propose here will only be realised when it is scaled up to cover a number of different fields in different settings.
NHS Executive, London Region
Constraints on Nucleon Decay via "Invisible" Modes from the Sudbury Neutrino Observatory
Data from the Sudbury Neutrino Observatory have been used to constrain the
lifetime for nucleon decay to ``invisible'' modes, such as n -> 3 nu. The
analysis was based on a search for gamma-rays from the de-excitation of the
residual nucleus that would result from the disappearance of either a proton or
neutron from O16. A limit of tau_inv > 2 x 10^{29} years is obtained at 90%
confidence for either neutron or proton decay modes. This is about an order of
magnitude more stringent than previous constraints on invisible proton decay
modes and 400 times more stringent than previous limits on similar neutron modes.
Comment: Update includes missing efficiency factor (limits change by a factor of 2). Submitted to Physical Review Letters.
First Neutrino Observations from the Sudbury Neutrino Observatory
The first neutrino observations from the Sudbury Neutrino Observatory are
presented from preliminary analyses. Based on energy, direction and location,
the data in the region of interest appear to be dominated by 8B solar
neutrinos, detected by the charged current reaction on deuterium and elastic
scattering from electrons, with very little background. Measurements of
radioactive backgrounds indicate that the measurement of all active neutrino
types via the neutral current reaction on deuterium will be possible with small
systematic uncertainties. Quantitative results for the fluxes observed with
these reactions will be provided when further calibrations have been completed.
Comment: LaTeX, 7 pages, 10 figures. Invited paper at Neutrino 2000 Conference, Sudbury, Canada, June 16-21, 2000; to be published in the Proceedings.
Measurement of the Total Active 8B Solar Neutrino Flux at the Sudbury Neutrino Observatory with Enhanced Neutral Current Sensitivity
The Sudbury Neutrino Observatory (SNO) has precisely determined the total
active (nu_x) 8B solar neutrino flux without assumptions about the energy
dependence of the nu_e survival probability. The measurements were made with
dissolved NaCl in the heavy water to enhance the sensitivity and signature for
neutral-current interactions. The flux is found to be 5.21 +/- 0.27 (stat) +/-
0.38 (syst) x10^6 cm^{-2}s^{-1}, in agreement with previous measurements and
standard solar models. A global analysis of these and other solar and reactor
neutrino results yields Delta m^{2} = 7.1^{+1.2}_{-0.6} x 10^{-5} eV^2 and theta = 32.5^{+2.4}_{-2.3} degrees. Maximal mixing is rejected at the equivalent of 5.4 standard deviations.
Comment: Submitted to Phys. Rev. Lett.
Electron Antineutrino Search at the Sudbury Neutrino Observatory
Upper limits on the \nuebar flux at the Sudbury Neutrino Observatory have
been set based on the \nuebar charged-current reaction on deuterium. The
reaction produces a positron and two neutrons in coincidence. This distinctive
signature allows a search with very low background for \nuebar's from the Sun
and other potential sources. Both differential and integral limits on the
\nuebar flux have been placed in the energy range from 4 -- 14.8 MeV. For an
energy-independent \nu_e --> \nuebar conversion mechanism, the integral limit
on the flux of solar \nuebar's in the energy range from 4 -- 14.8 MeV is found
to be \Phi_\nuebar <= 3.4 x 10^4 cm^{-2} s^{-1} (90% C.L.), which corresponds
to 0.81% of the standard solar model 8B \nu_e flux of 5.05 x 10^6 cm^{-2}
s^{-1}, and is consistent with the more sensitive limit from KamLAND in the 8.3
-- 14.8 MeV range of 3.7 x 10^2 cm^{-2} s^{-1} (90% C.L.). In the energy range
from 4 -- 8 MeV, a search for \nuebar's is conducted using coincidences in
which only the two neutrons are detected. Assuming a \nuebar spectrum for the
neutron-induced fission of naturally occurring elements, a flux limit of \Phi_\nuebar <= 2.0 x 10^6 cm^{-2} s^{-1} (90% C.L.) is obtained.
Comment: Submitted to Phys. Rev.
Measurement of the rate of nu_e + d --> p + p + e^- interactions produced by 8B solar neutrinos at the Sudbury Neutrino Observatory
Solar neutrinos from the decay of ^8B have been detected at the Sudbury
Neutrino Observatory (SNO) via the charged current (CC) reaction on deuterium
and by the elastic scattering (ES) of electrons. The CC reaction is sensitive
exclusively to nu_e's, while the ES reaction also has a small sensitivity to
nu_mu's and nu_tau's. The flux of nu_e's from ^8B decay measured by the CC
reaction rate is \phi^CC(nu_e) = 1.75 +/- 0.07 (stat.) +0.12/-0.11 (sys.) +/- 0.05 (theor.) x 10^6 /cm^2 s. Assuming no flavor transformation, the flux inferred from the ES reaction rate is \phi^ES(nu_x) = 2.39 +/- 0.34 (stat.) +0.16/-0.14 (sys.) x 10^6 /cm^2 s.
Comparison of \phi^CC(nu_e) to the Super-Kamiokande Collaboration's precision
value of \phi^ES(\nu_x) yields a 3.3 sigma difference, providing evidence that
there is a non-electron flavor active neutrino component in the solar flux. The
total flux of active ^8B neutrinos is thus determined to be 5.44 +/-0.99 x
10^6/cm^2 s, in close agreement with the predictions of solar models.
Comment: 6 pages (LaTeX), 3 figures. Submitted to Phys. Rev. Lett.
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and the single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
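As a hedged illustration of the analysis described in the Methods (on invented data, not the CholeS or single-surgeon datasets), the following sketch computes Kendall's tau between difficulty grade and a dichotomous outcome, and an AUROC treating the grade as the predictor of that outcome.

```python
# Hedged sketch of the correlation and ROC analyses; all values are invented.
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

difficulty_grade = [1, 1, 2, 2, 2, 3, 3, 4, 4, 5]   # Nassar grade per operation
converted_to_open = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]  # dichotomous outcome (0/1)

tau, p_value = kendalltau(difficulty_grade, converted_to_open)
auroc = roc_auc_score(converted_to_open, difficulty_grade)
print(f"Kendall's tau = {tau:.2f} (p = {p_value:.3f}), AUROC = {auroc:.2f}")
```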
Results
A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.